25 research outputs found

    Sparse visual models for biologically inspired sensorimotor control

    Given the importance of using resources efficiently in the competition for survival, it is reasonable to think that natural evolution has discovered efficient cortical coding strategies for representing natural visual information. Sparse representations have intrinsic advantages in fault tolerance and low power consumption, and are therefore attractive for robot sensorimotor control, where they provide a strong basis for decision-making. Inspired by the mammalian brain and its ventral visual pathway, we present in this paper a hierarchical sparse coding network architecture that extracts visual features for use in sensorimotor control. Testing with natural images demonstrates that this sparse coding facilitates processing and learning in subsequent layers. Previous studies have shown how the responses of complex cells could be sparsely represented by a higher-order neural layer. Here we extend sparse coding to each network layer, showing that detailed modeling of earlier stages in the visual pathway enhances the characteristics of the receptive fields developed in subsequent stages. The resulting network is more dynamic, with richer and more biologically plausible input and output representations.
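    As an illustration of the kind of sparse coding the abstract refers to (a generic sketch, not the paper's actual network), the following infers a sparse code for a signal over a fixed dictionary by iterative soft-thresholding (ISTA); the dictionary, signal, and all parameters are illustrative assumptions.

```python
import numpy as np

def ista_sparse_code(D, x, lam=0.5, n_iter=300):
    """Infer a sparse code a minimising ||x - D @ a||^2 / 2 + lam * ||a||_1
    by iterative soft-thresholding (ISTA)."""
    L = np.linalg.norm(D, ord=2) ** 2          # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        a = a - D.T @ (D @ a - x) / L          # gradient step on the reconstruction term
        a = np.sign(a) * np.maximum(np.abs(a) - lam / L, 0.0)  # soft threshold -> sparsity
    return a

rng = np.random.default_rng(0)
D = rng.standard_normal((16, 64))
D /= np.linalg.norm(D, axis=0)                 # unit-norm dictionary atoms
x = 2.0 * D[:, 3] - 1.5 * D[:, 40]             # signal built from two atoms
a = ista_sparse_code(D, x)
print(np.count_nonzero(a), "active units out of", a.size)
```

    Only a handful of units stay active, which is the fault-tolerance and low-power property the abstract appeals to.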

    A biologically inspired computational model of the Block Copying Task

    We present in this paper a biologically inspired model of the Basal Ganglia which treats block copying as a sequence learning task. By breaking a relatively complex task into simpler operations with well-defined skills, an approach termed skill-based machine design, computational models can be devised to complete such tasks. The Basal Ganglia are critically involved in sensorimotor control. On the learning side, Actor-Critic architectures have been proposed to model the Basal Ganglia, with temporal-difference learning as the learning algorithm. The model is implemented, and simulation results are presented to show that it can successfully complete the task.
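    The Actor-Critic scheme with temporal-difference learning mentioned above can be sketched in a few lines. This is a toy tabular version on a one-dimensional chain task, an illustrative assumption rather than the paper's Basal Ganglia model; the same TD error drives both the critic and the actor.

```python
import numpy as np

# Tabular actor-critic with a TD(0) critic on a toy 1-D chain:
# the agent starts at state 0 and is rewarded for reaching the right end.
N_STATES, GOAL = 5, 4
ALPHA_V, ALPHA_PI, GAMMA = 0.1, 0.1, 0.95
rng = np.random.default_rng(1)

V = np.zeros(N_STATES)              # critic: state-value estimates
prefs = np.zeros((N_STATES, 2))     # actor: preferences for actions {left, right}

def policy(s):
    p = np.exp(prefs[s] - prefs[s].max())
    return p / p.sum()

for _ in range(2000):               # episodes
    s = 0
    while s != GOAL:
        p = policy(s)
        a = rng.choice(2, p=p)
        s2 = max(0, s - 1) if a == 0 else s + 1
        r = 1.0 if s2 == GOAL else 0.0
        delta = r + GAMMA * V[s2] * (s2 != GOAL) - V[s]  # TD error
        V[s] += ALPHA_V * delta                          # critic update
        grad = -p                                        # d log pi / d prefs[s]
        grad[a] += 1.0
        prefs[s] += ALPHA_PI * delta * grad              # actor update
        s = s2

print([round(policy(s)[1], 2) for s in range(GOAL)])     # P(move right) per state
```

    After training, the probability of moving toward the goal dominates in every state.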

    Product Unit Learning

    Product units provide a method of automatically learning the higher-order input combinations required for the efficient synthesis of Boolean logic functions by neural networks. Product units also have a higher information capacity than sigmoidal networks. However, this activation function has not received much attention in the literature. A possible reason is that one encounters problems when using standard backpropagation to train networks containing these units. This report examines these problems and evaluates the performance of three training algorithms on networks of this type. Empirical results indicate that the error surface of networks containing product units has more local minima than that of corresponding networks with summation units. For this reason, a combination of local and global training algorithms was found to provide the most reliable convergence. We then investigate how `hints' can be added to the training algorithm. By extracting a common frequency from the input weights and training this frequency separately, we show that convergence can be accelerated. A constructive algorithm is then introduced which adds product units to a network as required by the problem. Simulations show that for the same problems this method creates a network with significantly fewer neurons than those constructed by the tiling and upstart algorithms. To compare their performance with other transfer functions, product units were implemented as candidate units in the Cascade Correlation (CC) \cite{Fahlman90} system. Using these candidate units resulted in smaller networks that trained faster than when any of the standard (three sigmoidal types and one Gaussian) transfer functions were used. This superiority was confirmed when a pool of candidate units with four different nonlinear activation functions, competing for addition to the network, was used.
    Extensive simulations showed that for the problem of implementing random Boolean logic functions, product units are always chosen over any of the other transfer functions. (Also cross-referenced as UMIACS-TR-95-80.)
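    A product unit computes a weighted product of its inputs rather than a weighted sum, so the learned weights act as exponents encoding higher-order input combinations. A minimal sketch (inputs assumed positive here; handling negative inputs needs a complex-logarithm treatment, which is one source of the training difficulties the abstract mentions):

```python
import numpy as np

def product_unit(x, w):
    """Product unit: y = prod_i x_i ** w_i, computed as exp(sum_i w_i * log x_i).
    Inputs are assumed positive; negative inputs would require complex logs."""
    return float(np.exp(np.sum(w * np.log(x))))

# The learned exponents pick out higher-order terms such as x1^2 * x2:
print(product_unit(np.array([2.0, 3.0]), np.array([2.0, 1.0])))   # 2^2 * 3 = 12.0
```

    Because the unit is differentiable in w, it can be trained by backpropagation, but the exponentiation makes the error surface far more multimodal than for summation units.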

    Building Rectangular Floorplans–A Graph Theoretical Approach


    A three-port adiabatic register file suitable for embedded applications

    Adiabatic logic promises extremely low power consumption for those applications where slower clock rates are acceptable. However, there have been very few adiabatic memory designs, and any circuit of even moderate complexity requires some form of RAM. This paper presents a register file implemented entirely with adiabatic logic, and fabricated using a 1.2 µm CMOS technology. Comparison with a conventional CMOS logic implementation, using both measured and simulated results, indicates significant power savings have been realised.
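    The power advantage of adiabatic switching can be seen from the standard first-order estimate: charging a capacitance C through resistance R over a ramp time T much longer than RC dissipates roughly (RC/T)·C·V², against ½·C·V² for a conventional switching event. The component values below are illustrative assumptions, not figures from the paper:

```python
# First-order energy comparison, conventional vs adiabatic switching.
# All component values are illustrative assumptions, not from the paper.
C = 100e-15    # node capacitance: 100 fF
V = 5.0        # supply voltage typical of a 1.2 um process
R = 1e3        # effective switch resistance: 1 kOhm
T = 100e-9     # adiabatic ramp time: 100 ns (slow clock, T >> R*C)

E_conv = 0.5 * C * V ** 2            # dissipated per conventional switching event
E_adia = (R * C / T) * C * V ** 2    # dissipated charging through R over ramp time T

print(f"conventional: {E_conv:.2e} J, adiabatic: {E_adia:.2e} J, "
      f"saving: {E_conv / E_adia:.0f}x")
```

    The ratio T/(2RC) is why adiabatic designs only pay off at the slower clock rates the abstract notes.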